
fix: make heavy ML dependencies optional for lightweight installs #57

Merged
abrichr merged 2 commits into main from fix/optional-heavy-deps
Mar 19, 2026
Conversation

abrichr (Member) commented Mar 19, 2026

Summary

  • Move torch, torchvision, bitsandbytes, peft, and transformers from required dependencies to [project.optional-dependencies.training]
  • Wrap all top-level imports of these packages in try/except ImportError with clear error messages when users try to instantiate training classes without them
  • Make training/grpo/__init__.py lazy-load the torch-dependent GRPOTrainer module
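The two import-guarding patterns described above might look roughly like the sketch below. The class name `BaseVLMAdapter` and the `GRPOTrainer` attribute come from the PR text; the module path in the lazy import, the error message wording, and the class body are illustrative assumptions, not the actual implementation.

```python
# Pattern 1: guard the heavy import so `import openadapt_ml` never fails.
try:
    import torch  # heavy optional dependency; absent in lightweight installs
except ImportError:
    torch = None


class BaseVLMAdapter:
    """Hypothetical stand-in: fail at instantiation time, not import time."""

    def __init__(self):
        if torch is None:
            raise ImportError(
                "torch is required for training classes; "
                "install with: pip install 'openadapt-ml[training]'"
            )


# Pattern 2 (PEP 562): a module-level __getattr__, as in
# training/grpo/__init__.py, defers the torch-dependent import until
# GRPOTrainer is first accessed. The import path below is an assumption.
def __getattr__(name):
    if name == "GRPOTrainer":
        from openadapt_ml.training.grpo.grpo_trainer import GRPOTrainer
        return GRPOTrainer
    raise AttributeError(f"module {__name__!r} has no attribute {name!r}")
```

With this shape, the guarded import keeps `import` of the package cheap, while the deferred `ImportError` surfaces only at the point a user actually needs the heavy dependency.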

Motivation

The Wright worker needs to install openadapt-evals (which depends on openadapt-ml), but doesn't need local model training. The current required deps pull in torch (873MB), torchvision, bitsandbytes, peft, and transformers, making the install massive and slow for lightweight consumers.

After this change:

  • pip install openadapt-ml installs only lightweight deps (anthropic, click, pillow, pydantic-settings, etc.)
  • pip install openadapt-ml[training] installs everything needed for local model training/inference
  • import openadapt_ml works without torch installed
  • Attempting to instantiate BaseVLMAdapter, QwenVLAdapter, DummyAdapter, or NextActionDataset without torch raises a clear ImportError with install instructions
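In `pyproject.toml` terms, the split described above might look like the fragment below. The grouping follows the PR text, but the exact dependency lists and any version pins in the real file are assumptions.

```toml
[project]
name = "openadapt-ml"
# Lightweight deps only; installed by plain `pip install openadapt-ml`.
dependencies = [
    "anthropic",
    "click",
    "pillow",
    "pydantic-settings",
]

[project.optional-dependencies]
# Heavy ML stack; installed by `pip install openadapt-ml[training]`.
training = [
    "torch",
    "torchvision",
    "bitsandbytes",
    "peft",
    "transformers",
]
```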

Test plan

  • Verify python -c "import openadapt_ml" works without torch
  • Verify python -c "from openadapt_ml.datasets.next_action import build_next_action_sft_samples" works without torch
  • Verify pip install openadapt-ml[training] pulls in torch and training works as before
  • Verify clear error messages when trying to use training classes without torch

🤖 Generated with Claude Code

Move torch, torchvision, bitsandbytes, peft, and transformers from
required dependencies to [project.optional-dependencies.training].
Wrap all top-level imports of these packages in try/except ImportError
so the package can be imported without them installed.

This unblocks lightweight consumers (e.g. Wright worker installing
openadapt-evals) that don't need local model training/inference.
Users who need training can install with: pip install openadapt-ml[training]

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
@abrichr abrichr merged commit aa954ba into main Mar 19, 2026
4 checks passed